Learning interactions through hierarchical group-lasso regularization

Authors

  • Michael Lim
  • Trevor Hastie
Abstract

We introduce a method for learning pairwise interactions in a manner that satisfies strong hierarchy: whenever an interaction is estimated to be nonzero, both its associated main effects are also included in the model. We motivate our approach by modeling pairwise interactions for categorical variables with arbitrary numbers of levels, and then show how we can accommodate continuous variables and mixtures thereof. Our approach allows us to dispense with explicitly applying constraints on the main effects and interactions for identifiability, which results in interpretable interaction models. We compare our method with existing approaches on both simulated and real data, including a genome-wide association study, all using our R package glinternet.
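The overlapped-group construction behind strong hierarchy can be illustrated with a small sketch. This is not the glinternet implementation; the coefficient names (`alpha`, `beta`, `gamma`) and the penalty function are hypothetical, and the snippet only shows the key idea: the interaction group bundles copies of both main effects with the interaction block, so a nonzero interaction necessarily brings its main effects into the model.

```python
import numpy as np

# Toy coefficients for two 3-level categorical variables (assumed setup):
# main effects alpha (3,) and beta (3,), interaction matrix gamma (3, 3).
rng = np.random.default_rng(0)
alpha = rng.normal(size=3)
beta = rng.normal(size=3)
gamma = rng.normal(size=(3, 3))

def hierarchy_penalty(alpha, beta, gamma):
    """Overlapped group-lasso penalty in the spirit of strong hierarchy.

    Each main effect forms its own group, and the interaction group
    contains the interaction together with copies of both main effects,
    so selecting the interaction group also activates the main effects.
    """
    main_groups = np.linalg.norm(alpha) + np.linalg.norm(beta)
    interaction_group = np.sqrt(
        np.sum(alpha ** 2) + np.sum(beta ** 2) + np.sum(gamma ** 2)
    )
    return main_groups + interaction_group

print(hierarchy_penalty(alpha, beta, gamma))
```

Because the interaction group's Euclidean norm includes the main-effect coefficients, the only way to zero out the penalty contribution of an interaction is to zero the whole group, which is what yields the hierarchy property.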


Similar articles

Sufficient Conditions for Generating Group Level Sparsity in a Robust Minimax Framework

Regularization techniques have become a principled tool in statistics and machine learning research and practice. However, in most situations, these regularization terms are not well interpreted, especially with regard to how they relate to the loss function and the data. In this paper, we propose a robust minimax framework to interpret the relationship between data and regularization terms for a large cla...


Structured Sparsity and Generalization

We present a data dependent generalization bound for a large class of regularized algorithms which implement structured sparsity constraints. The bound can be applied to standard squared-norm regularization, the Lasso, the group Lasso, some versions of the group Lasso with overlapping groups, multiple kernel learning and other regularization schemes. In all these cases competitive results are o...


The Group-Lasso: ℓ1,∞ Regularization versus ℓ1,2 Regularization

The ℓ1,∞ norm and the ℓ1,2 norm are well known tools for joint regularization in Group-Lasso methods. While the ℓ1,2 version has been studied in detail, there are still open questions regarding the uniqueness of solutions and the efficiency of algorithms for the ℓ1,∞ variant. For the latter, we characterize the conditions for uniqueness of solutions, we present a simple test for uniqueness, and...
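The two norms compared above differ only in how each group is summarized before summing: the ℓ1,2 norm takes each group's Euclidean norm, while the ℓ1,∞ norm takes each group's largest absolute entry. A minimal sketch, with a toy coefficient matrix whose rows are the groups:

```python
import numpy as np

# Toy coefficient matrix (assumed example): each row is one group.
W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 1.0]])

def l12_norm(W):
    # ℓ1,2: sum over groups of each group's Euclidean (ℓ2) norm.
    return float(np.sum(np.linalg.norm(W, axis=1)))

def l1inf_norm(W):
    # ℓ1,∞: sum over groups of each group's largest absolute entry.
    return float(np.sum(np.max(np.abs(W), axis=1)))

print(l12_norm(W))    # 5 + 0 + sqrt(2)
print(l1inf_norm(W))  # 4 + 0 + 1
```

Both norms vanish on a group exactly when the whole group is zero, which is why either choice induces group-level sparsity; they differ in how they spread weight within an active group.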


Spatial Projection of Multiple Climate Variables Using Hierarchical Multitask Learning

Future projection of climate is typically obtained by combining outputs from multiple Earth System Models (ESMs) for several climate variables such as temperature and precipitation. While the IPCC has traditionally used a simple model output average, recent work has illustrated potential advantages of using a multitask learning (MTL) framework for projections of individual climate variables. In thi...


Consistency of the Group Lasso and Multiple Kernel Learning

We consider the least-squares regression problem with regularization by a block 1-norm, i.e., a sum of Euclidean norms over spaces of dimension larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the 1-norm, in which all spaces have dimension one and which is commonly referred to as the Lasso. In this paper, we study the asymptotic model consistency of...




Publication date: 2013